The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. In particular, it has been proven that the 2-block ADMM converges globally for any penalty parameter $\gamma>0$. In this sense, the 2-block ADMM allows the parameter to be free, i.e., there is no need to restrict its value to ensure convergence when implementing the algorithm. For the 3-block ADMM, however, Chen \etal \cite{Chen-admm-failure-2013} recently constructed a counter-example showing that it can diverge if no further condition is imposed. Existing sufficient conditions that guarantee the convergence of the 3-block ADMM typically require $\gamma$ to be smaller than a certain bound, which is usually either difficult to compute or too small for the algorithm to be practical. In this paper, we show that the 3-block ADMM still converges globally for any penalty parameter $\gamma>0$, provided that the third function $f_3$ in the objective is smooth and strongly convex with condition number in $[1,1.0798)$, in addition to some other mild conditions. This requirement covers an important class of problems, which we call regularized least squares decomposition (RLSD) in this paper.
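For reference, the abstract does not display the iteration itself; a standard statement of the 3-block ADMM it discusses is sketched below, where the problem $\min f_1(x_1)+f_2(x_2)+f_3(x_3)$ subject to $A_1x_1+A_2x_2+A_3x_3=b$, the augmented Lagrangian $\mathcal{L}_\gamma$, and the symbols $A_i$, $b$, $\lambda$ are our assumed notation rather than quoted from the paper:
\begin{align*}
\mathcal{L}_\gamma(x_1,x_2,x_3;\lambda) &= \sum_{i=1}^{3} f_i(x_i) - \lambda^\top\!\Big(\sum_{i=1}^{3} A_i x_i - b\Big) + \frac{\gamma}{2}\Big\|\sum_{i=1}^{3} A_i x_i - b\Big\|^2, \\
x_1^{k+1} &= \operatorname*{argmin}_{x_1}\ \mathcal{L}_\gamma(x_1, x_2^{k}, x_3^{k};\lambda^{k}), \\
x_2^{k+1} &= \operatorname*{argmin}_{x_2}\ \mathcal{L}_\gamma(x_1^{k+1}, x_2, x_3^{k};\lambda^{k}), \\
x_3^{k+1} &= \operatorname*{argmin}_{x_3}\ \mathcal{L}_\gamma(x_1^{k+1}, x_2^{k+1}, x_3;\lambda^{k}), \\
\lambda^{k+1} &= \lambda^{k} - \gamma\big(A_1 x_1^{k+1} + A_2 x_2^{k+1} + A_3 x_3^{k+1} - b\big).
\end{align*}
The three blocks are updated sequentially in Gauss--Seidel fashion with a single penalty parameter $\gamma$; the paper's result concerns when this scheme converges for every $\gamma>0$.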